Estimating common principal components in high dimensions
Authors
Abstract
Similar articles
Detecting influential observations in principal components and common principal components
Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (l...
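This abstract refers to flagging outliers by comparing robustified Mahalanobis distances against an asymptotic chi-square cutoff. A minimal sketch of that general idea (not the procedure developed in the cited paper), assuming scikit-learn's MinCovDet as the robust location/scatter estimator:

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
p = 4
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=200)
X[:5] += 6.0  # plant a few gross outliers

# Robust location/scatter via the Minimum Covariance Determinant estimator.
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)  # squared robust Mahalanobis distances

# Classical cutoff: 97.5% quantile of a chi-square with p degrees of freedom.
cutoff = chi2.ppf(0.975, df=p)
outliers = np.flatnonzero(d2 > cutoff)
print(outliers)
```

The abstract's point is precisely that this chi-square approximation can be unreliable for several robust estimators, which motivates studying the cutoff more carefully.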
Functional common principal components models
In this paper, we discuss the extension to the functional setting of the common principal component model that has been widely studied when dealing with multivariate observations. We provide estimators of the common eigenfunctions and study their asymptotic behavior.
Common Functional Principal Components 1
Functional principal component analysis (FPCA) based on the Karhunen–Loève decomposition has been successfully applied in many applications, mainly for one sample problems. In this paper we consider common functional principal components for two sample problems. Our research is motivated not only by the theoretical challenge of this data situation, but also by the actual question of dynamics of...
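As a point of reference for the Karhunen–Loève decomposition mentioned above, here is a minimal one-sample FPCA sketch on a common observation grid, using only NumPy; the two-sample common functional principal components methodology of the cited paper is not reproduced here.

```python
import numpy as np

# Simulate n curves observed on a common grid of length m.
rng = np.random.default_rng(1)
n, m = 100, 50
t = np.linspace(0, 1, m)
# Two-component Karhunen-Loeve truth: random scores times fixed eigenfunctions.
scores = rng.normal(size=(n, 2)) * np.array([2.0, 1.0])
basis = np.vstack([np.sqrt(2) * np.sin(np.pi * t),
                   np.sqrt(2) * np.sin(2 * np.pi * t)])
X = scores @ basis + 0.1 * rng.normal(size=(n, m))

# FPCA on the grid: eigendecomposition of the discretized sample covariance.
Xc = X - X.mean(axis=0)
C = (Xc.T @ Xc) / n                      # m x m covariance "function"
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]
eigenfunctions = evecs[:, order[:2]].T   # leading two estimated eigenfunctions
print(evals[order[:2]])
```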
Estimating Invariant Principal Components Using Diagonal Regression
In this work we apply the method of diagonal regression to derive an alternative version of Principal Component Analysis (PCA). “Diagonal regression” was introduced by Ragnar Frisch (the first economics Nobel laureate) in his paper “Correlation and Scatter in Statistical Variables” (1928). The benefits of using diagonal regression in PCA are that it provides components that are scale-invariant ...
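To illustrate the scale-invariance issue that motivates this abstract (not Frisch's diagonal regression itself), the following sketch shows that ordinary covariance-based PCA changes its leading axis when one variable is measured in different units:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3)) @ np.array([[1.0, 0.5, 0.0],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 1.0]])

def leading_pc(X):
    # First principal axis from the eigendecomposition of the covariance matrix.
    C = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(C)
    return evecs[:, -1]

v_original = leading_pc(X)
v_rescaled = leading_pc(X * np.array([1.0, 100.0, 1.0]))  # change units of column 2

# The two leading axes differ (even up to sign): covariance-based PCA is not
# scale-invariant, which is the shortcoming scale-invariant variants address.
print(v_original, v_rescaled)
```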
Pseudo-Gaussian Inference in Heterokurtic Elliptical Common Principal Components Models
The so-called Common Principal Components (CPC) Model, in which the covariance matrices Σi of m populations are assumed to have identical eigenvectors, was introduced by Flury (1984), who develops Gaussian parametric inference methods for this model (Gaussian maximum likelihood estimation and Gaussian likelihood ratio testing). A key result in that context is the joint asymptotic normality of t...
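The CPC assumption described here, that all group covariance matrices share one set of eigenvectors, can be illustrated with a small simulation. The sketch below uses a deliberately crude common-eigenvector estimate (diagonalizing the sum of the sample covariances), not Flury's Gaussian maximum likelihood (FG) algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)
p = 3
# A common orthogonal eigenvector matrix B shared by both populations.
B, _ = np.linalg.qr(rng.normal(size=(p, p)))
Sigma1 = B @ np.diag([5.0, 2.0, 0.5]) @ B.T   # group-specific eigenvalues
Sigma2 = B @ np.diag([1.0, 4.0, 0.2]) @ B.T

X1 = rng.multivariate_normal(np.zeros(p), Sigma1, size=500)
X2 = rng.multivariate_normal(np.zeros(p), Sigma2, size=500)
S1, S2 = np.cov(X1, rowvar=False), np.cov(X2, rowvar=False)

# Crude common-eigenvector estimate: under the CPC assumption, Sigma1 + Sigma2
# has the same eigenvectors B, so diagonalize the sum of sample covariances.
_, B_hat = np.linalg.eigh(S1 + S2)

# Check: the estimated axes should nearly diagonalize each group's covariance.
print(np.round(B_hat.T @ S1 @ B_hat, 2))
print(np.round(B_hat.T @ S2 @ B_hat, 2))
```

Under the CPC model the off-diagonal entries printed above should be close to zero for every group, which is the structure Flury's likelihood-based inference exploits.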
Journal
Journal title: Advances in Data Analysis and Classification
Year: 2013
ISSN: 1862-5347, 1862-5355
DOI: 10.1007/s11634-013-0139-1